Factor Graph Neural Networks

Neural Information Processing Systems

Most of the successful deep neural network architectures are structured, often consisting of elements like convolutional neural networks and gated recurrent neural networks. Recently, graph neural networks (GNNs) have been successfully applied to graph-structured data such as point cloud and molecular data. These networks often only consider pairwise dependencies, as they operate on a graph structure. We generalize the GNN into a factor graph neural network (FGNN), providing a simple way to incorporate dependencies among multiple variables. We show that FGNN is able to represent Max-Product belief propagation, an approximate inference method on probabilistic graphical models, providing a theoretical understanding of the capabilities of FGNN and related GNNs. Experiments on synthetic and real datasets demonstrate the potential of the proposed architecture.
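One way to picture the kind of message passing such a factor graph neural network performs is the following minimal NumPy sketch. This is not the authors' implementation: the one-layer `mlp`, the toy factor graph, the feature dimension, and the max aggregation are all invented for illustration, standing in for the learned networks in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W, b):
    """One-layer perceptron with ReLU, standing in for a learned map."""
    return np.maximum(W @ x + b, 0.0)

# Toy factor graph: 4 variable nodes, 2 factors; factor 0 touches
# variables {0, 1, 2} (a higher-order factor), factor 1 touches {2, 3}.
factors = [[0, 1, 2], [2, 3]]
d = 8  # hypothetical feature dimension
var_feat = rng.standard_normal((4, d))
fac_feat = rng.standard_normal((2, d))

# Shared (randomly initialized) weights for the two message directions.
Wvf, bvf = rng.standard_normal((d, d)) * 0.1, np.zeros(d)
Wfv, bfv = rng.standard_normal((d, d)) * 0.1, np.zeros(d)

# Variable-to-factor step: each factor max-aggregates transformed
# features of the variables in its scope.
fac_feat = np.stack([
    np.max([mlp(var_feat[v], Wvf, bvf) for v in scope], axis=0)
    for scope in factors
])

# Factor-to-variable step: each variable max-aggregates over the
# factors whose scope contains it.
var_feat = np.stack([
    np.max([mlp(fac_feat[f], Wfv, bfv)
            for f, scope in enumerate(factors) if v in scope], axis=0)
    for v in range(4)
])

print(var_feat.shape)  # (4, 8)
```

Because a higher-order factor node aggregates over its whole scope at once, a dependency among three or more variables is captured in a single round, which is the point the abstract makes about going beyond pairwise edges.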



Review for NeurIPS paper: Factor Graph Neural Networks

Neural Information Processing Systems

Weaknesses: The proposed architecture is not particularly novel and the experiments can be improved. While the theoretical analysis is quite interesting, it is not significant enough to outweigh the aforementioned issues (e.g., the analysis mainly relies on Lemma 1 of Kohli et al.). While the proposed factor graph neural network (FGNN) is guaranteed to express a family of higher-order interactions, in the end FGNN is an instance of an MPNN applied to a heterogeneous graph with two types of vertices (random variables and factors). I also think the considered experiments are limited, since they only consider the case where (1) training and evaluation are done on the same graph and (2) the factors are easily expressed as a representation of fixed dimension. In other words, the considered experiments are not very convincing for showing that the proposed FGNN works across general graphical models.


Review for NeurIPS paper: Factor Graph Neural Networks

Neural Information Processing Systems

This paper generalizes graph neural networks to factor graphs, and shows that the max-product algorithm is a special case (though the construction can require an exponential number of rank-1 tensors). Experiments on synthetic data, LDPC decoding, and human motion prediction are presented. The major concerns were addressed in the rebuttal, which provides additional clarifications and evidence. The authors should include them in the final version of the paper.


Factor Graph Neural Networks

Zhang, Zhen, Dupty, Mohammed Haroon, Wu, Fan, Shi, Javen Qinfeng, Lee, Wee Sun

arXiv.org Artificial Intelligence

In recent years, we have witnessed a surge of Graph Neural Networks (GNNs), most of which can learn powerful representations in an end-to-end fashion, with great success in many real-world applications. They resemble Probabilistic Graphical Models (PGMs) but break free from some of their limitations. By aiming to provide expressive methods for representation learning instead of computing marginals or most likely configurations, GNNs allow flexibility in the choice of information-flow rules while maintaining good performance. Despite their success, they lack efficient ways to represent and learn higher-order relations among variables/nodes. More expressive higher-order GNNs, which operate on k-tuples of nodes, need increased computational resources to process higher-order tensors. We propose Factor Graph Neural Networks (FGNNs) to effectively capture higher-order relations for inference and learning. To do so, we first derive an efficient approximate Sum-Product loopy belief propagation inference algorithm for discrete higher-order PGMs. We then neuralize the novel message passing scheme into a Factor Graph Neural Network (FGNN) module by allowing richer representations of the message update rules; this facilitates both efficient inference and powerful end-to-end learning. We further show that with a suitable choice of message aggregation operators, our FGNN is also able to represent Max-Product belief propagation, providing a single family of architecture that can represent both Max and Sum-Product loopy belief propagation. Our extensive experimental evaluation on synthetic as well as real datasets demonstrates the potential of the proposed model.
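The Sum-Product inference that this abstract neuralizes can be illustrated on a tiny discrete model. The sketch below is not the paper's approximate algorithm: it runs one exact sum-product step on a tree-structured factor graph with a single third-order factor (so the computed belief must match the brute-force marginal); all potentials are random tensors invented for illustration.

```python
import numpy as np

# Tiny discrete PGM: p(x0, x1, x2) ∝ u0(x0) u1(x1) u2(x2) f(x0, x1, x2),
# with one third-order factor f and all variables binary.
rng = np.random.default_rng(1)
u = [rng.random(2) + 0.1 for _ in range(3)]
f = rng.random((2, 2, 2)) + 0.1

# Brute-force joint distribution for reference.
joint = f * u[0][:, None, None] * u[1][None, :, None] * u[2][None, None, :]
joint /= joint.sum()

# Sum-product on the (tree-structured) factor graph: variable-to-factor
# messages are just the unary potentials here; the factor-to-variable
# message for x0 sums f against the incoming messages over x1 and x2.
m_f_to_x0 = np.einsum('abc,b,c->a', f, u[1], u[2])
belief_x0 = u[0] * m_f_to_x0
belief_x0 /= belief_x0.sum()

print(np.allclose(belief_x0, joint.sum(axis=(1, 2))))  # True
```

On a loopy graph these messages would be iterated to a fixed point rather than computed once; the FGNN module described above replaces the fixed sum/product update with learned maps while keeping the same variable/factor message structure.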


Factor Graph Neural Network

Zhang, Zhen, Wu, Fan, Lee, Wee Sun

arXiv.org Machine Learning

Most of the successful deep neural network architectures are structured, often consisting of elements like convolutional neural networks and gated recurrent neural networks. Recently, graph neural networks have been successfully applied to graph-structured data such as point cloud and molecular data. These networks often only consider pairwise dependencies, as they operate on a graph structure. We generalize the graph neural network into a factor graph neural network (FGNN) in order to capture higher-order dependencies. We show that FGNN is able to represent Max-Product Belief Propagation, an approximate inference algorithm on probabilistic graphical models; hence it is able to do well when Max-Product does well. Promising results on both synthetic and real datasets demonstrate the effectiveness of the proposed model.
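The Max-Product Belief Propagation that FGNN is shown to represent can be sketched on the same kind of toy model. This is a hand-written illustration, not the paper's construction: on a tree-structured factor graph with one third-order factor over binary variables, replacing sum-product's sums with maxes recovers the MAP assignment exactly; all potentials are random and invented for illustration.

```python
import numpy as np

# Tiny model: one higher-order factor f over three binary variables
# plus unary potentials u. The graph is a tree, so Max-Product is exact.
rng = np.random.default_rng(2)
u = [rng.random(2) + 0.1 for _ in range(3)]
f = rng.random((2, 2, 2)) + 0.1

# Brute-force MAP assignment for reference.
joint = f * u[0][:, None, None] * u[1][None, :, None] * u[2][None, None, :]
map_brute = np.unravel_index(joint.argmax(), joint.shape)

# Max-product message: like sum-product, but with max in place of sum.
# Max-marginal of x0 is the max over x1, x2 of the unnormalized joint.
max_marg_x0 = u[0] * np.max(f * u[1][None, :, None] * u[2][None, None, :],
                            axis=(1, 2))
x0 = int(max_marg_x0.argmax())
# Decode the remaining variables by maximizing the slice consistent
# with the chosen x0.
x1, x2 = np.unravel_index((f[x0] * np.outer(u[1], u[2])).argmax(), (2, 2))

print((x0, int(x1), int(x2)) == tuple(int(i) for i in map_brute))  # True
```

The only change from the sum-product version is the aggregation operator, which is why a single FGNN family with a suitable choice of aggregators can cover both algorithms.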